Web Survey Bibliography
New data collection technologies make it possible to combine many benefits of interviewer administration and self-administration. For example, a web-based questionnaire can offer clarification to a respondent who gives evidence of confusion. A natural extension of this process is the introduction of virtual or animated interviewing agents into computerized questionnaires: graphical, moving entities in the user interface that ask questions, record answers, and potentially do much more. The proposed talk reports a laboratory experiment in which animated interviewing agents spoke questions about ordinary, non-sensitive behaviors and 73 respondents answered aloud based on fictional scenarios (Schober & Conrad, 1997). Our main question is whether response accuracy is affected by how realistic the agent looks (the amount of facial and head movement) and how capably it can converse with a respondent (its ability to clarify questions when it seems this might help). The interviewing agent assigned to any one respondent was either high or low in 'visual realism' and high or low in 'dialogue capability.' Half of the scenarios were designed to be ambiguous without clarification. Looking just at these cases, respondents were approximately 30% more accurate when the agent was high in dialogue capability than when it was low. However, there was no impact of visual realism. Respondents looked at the agent 20-30% of the time, long enough to perceive its visual attributes; in fact, respondents' ratings of the agent were affected by its visual realism, as was the way they interacted with it. Yet high visual realism did not increase respondents' requests for clarification, one action that could have improved response accuracy. Interviewing agents asking non-sensitive questions will produce better data if they can converse intelligently; however, more realistic-looking agents might help in ways not studied here, e.g. motivating potential respondents to participate and complete questionnaires.
Web survey bibliography - Schober, M. F. (14)
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G., Schober, M. F., Antoun, C., Yan, H. Y., Hupp, A., Johnston, M., Ehlen, P., Vickers, L...
- Comparisons of Online Recruitment Strategies for Convenience Samples: Craigslist, Google AdWords, Facebook...; 2016; Antoun, C., Zhang, C., Conrad, F. G., Schober, M. F.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G., Schober, M. F., Jans, M., Orlowski, R. A., Nielsen, D., Levenstein, R. M.
- Social Media Analyses for Social Measurement; 2016; Schober, M. F., Pasek, J., Guggenheim, L., Lampe, C., Conrad, F. G.
- Mobile Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Report of the AAPOR...; 2014; Link, M. W., Murphy, J., Schober, M. F., Buskirk, T. D., Childs, J. H., Tesfaye, C.
- Effects of Self-Awareness on Disclosure During Skype Survey Interviews; 2013; Feuer, S., Schober, M. F.
- Disfluencies and Gaze Aversion in Unreliable Responses to Survey Questions; 2012; Schober, M. F., Conrad, F. G., Dijkstra, W., Ongena, Y. P.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Envisioning the Survey Interview of the Future; 2009; Conrad, F. G., Schober, M. F.
- Social Cues Can Affect Answers to Threatening Questions in Virtual Interviews; 2008; Lind, L. H., Schober, M. F., Conrad, F. G.
- Virtual Interviews on Mundane, Non-Sensitive Topics: Dialog Capability Affects Response Accuracy More...; 2008; Conrad, F. G., Schober, M. F., Jans, M., Orlowski, R. A., Nielsen, D.
- Surveys interviews and new communication technologies; 2007; Schober, M. F., Conrad, F. G.
- Promoting Uniform Question Understanding in Today's and Tomorrow's Surveys; 2005; Conrad, F. G., Schober, M. F.